An MCP server for the CI-1T prediction stability engine, exposing 20 tools and 1 resource. Evaluate model stability, probe any LLM for instability (BYOM: bring your own model), manage fleet sessions, detect drift, generate visualizations, and manage API keys. Seven of the tools run locally with no auth and no credits; signup includes 1,000 free credits.
## Overview

### 20 Tools + 1 Resource
#### Evaluation & Scoring
- evaluate: Prediction stability scoring (floats or Q0.16 fixed-point)
- fleet_evaluate: Multi-node fleet-wide evaluation
- probe: Probe any LLM for instability by sending the same prompt three times and comparing the responses. BYOM: bring your own model via any OpenAI-compatible API
- health: Engine status check
#### Fleet Sessions
- fleet_session_create / round / state / list / delete: Persistent multi-model comparison sessions
#### Account Management
- list_api_keys / create_api_key / delete_api_key: API key CRUD
- get_invoices: Billing history (Stripe)
#### Local Tools (no auth, no credits)
- interpret_scores: Statistical breakdown (mean, std, min/max, normalized)
- convert_scores: Float to Q0.16 conversion
- generate_config: Integration boilerplate for FastAPI, Express, etc.
- compare_windows: Drift detection between baseline and recent episodes
- alert_check: Custom threshold alerts with severity levels
- visualize: Interactive HTML charts
- onboarding: Setup guide
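Because the local tools operate purely on score arrays, their core logic is easy to illustrate. The sketch below shows, under assumed semantics, the kind of computation behind `convert_scores`, `interpret_scores`, and `compare_windows` — the function names and the 0.1 drift threshold are illustrative, not the tools' actual implementation:

```python
import statistics

def to_q016(x: float) -> int:
    """Float -> Q0.16 fixed-point (0 integer bits, 16 fractional bits).

    Scales by 2**16 and clamps to the representable range 0..65535,
    so out-of-range inputs saturate rather than wrap.
    """
    return max(0, min(65535, round(x * 65536)))

def score_summary(scores: list[float]) -> dict:
    """Statistical breakdown in the spirit of interpret_scores."""
    lo, hi = min(scores), max(scores)
    span = hi - lo
    return {
        "mean": statistics.fmean(scores),
        "std": statistics.pstdev(scores),
        "min": lo,
        "max": hi,
        # Each score rescaled into [0, 1] relative to the observed range.
        "normalized": [(s - lo) / span if span else 0.0 for s in scores],
    }

def window_drift(baseline: list[float], recent: list[float],
                 threshold: float = 0.1) -> dict:
    """Flag drift when the recent window's mean departs from the baseline's."""
    delta = statistics.fmean(recent) - statistics.fmean(baseline)
    return {"delta": delta, "drifted": abs(delta) > threshold}
```

For example, `to_q016(0.5)` yields 32768, and a baseline window averaging 0.5 against a recent window averaging 0.8 would be flagged as drifted at the default threshold.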
## Quick Start
1. Create a free account at collapseindex.org (1,000 free credits)
2. Copy your API key from the dashboard
3. Add the server config to Claude Desktop, Cursor, or VS Code
4. Ask your agent: "Help me get started with CI-1T"
## BYOM Probe
Probe any model, local or remote, without using credits:
- Ollama: `base_url = http://localhost:11434/v1`, `model = llama3`
- OpenAI: `base_url = https://api.openai.com/v1`, `model = gpt-4o`
- Any OpenAI-compatible endpoint works
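The probe tool handles this for you, but the underlying idea is easy to sketch: send the identical prompt to the same endpoint three times and compare the completions. The snippet below uses the standard OpenAI-compatible `/chat/completions` request shape; `chat_once` and `instability_ratio` are illustrative names, not part of the CI-1T API:

```python
import json
import urllib.request

def chat_once(base_url: str, model: str, prompt: str, api_key: str = "") -> str:
    """One chat completion against any OpenAI-compatible endpoint."""
    req = urllib.request.Request(
        f"{base_url.rstrip('/')}/chat/completions",
        data=json.dumps({
            "model": model,
            "temperature": 0,
            "messages": [{"role": "user", "content": prompt}],
        }).encode(),
        headers={
            "Content-Type": "application/json",
            # Local endpoints like Ollama typically need no key.
            **({"Authorization": f"Bearer {api_key}"} if api_key else {}),
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]

def instability_ratio(responses: list[str]) -> float:
    """Fraction of repeat responses that differ from the first one."""
    return sum(r != responses[0] for r in responses[1:]) / max(len(responses) - 1, 1)

if __name__ == "__main__":
    # e.g. a local Ollama model; swap in any OpenAI-compatible endpoint.
    replies = [chat_once("http://localhost:11434/v1", "llama3", "What is 2+2?")
               for _ in range(3)]
    print(instability_ratio(replies))
```

Identical responses give a ratio of 0.0; any divergence across the three runs pushes it toward 1.0, which is the kind of signal the probe surfaces.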
- Docker: hub.docker.com/r/collapseindex/ci1t-mcp
- npm: `@collapseindex/ci1t-mcp`
## Server Config
```json
{
  "mcpServers": {
    "ci1t": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "-e",
        "CI1T_API_KEY",
        "collapseindex/ci1t-mcp"
      ],
      "env": {
        "CI1T_API_KEY": "ci_your_key_here"
      }
    }
  }
}
```
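If you prefer the npm package over Docker, an equivalent config might look like the following — this assumes the package exposes an executable runnable via `npx`, which you should confirm against the package's own docs:

```json
{
  "mcpServers": {
    "ci1t": {
      "command": "npx",
      "args": ["-y", "@collapseindex/ci1t-mcp"],
      "env": {
        "CI1T_API_KEY": "ci_your_key_here"
      }
    }
  }
}
```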